Patent abstract:
According to an example aspect of the present invention, there is provided a personal multi-sensor apparatus (300) comprising a memory (320) configured to store plural sequences of sensor data elements and at least one processing core (310) configured to: derive, from the plural sequences of sensor data elements, plural sensor data segments, each sensor data segment comprising time-aligned sensor data element sub-sequences from at least two of the sequences of sensor data elements, and assign a label to at least some of the sensor data segments based on the sensor data elements comprised in the respective sensor data segments, to obtain a sequence of labels.
Publication number: FI20196079A1
Application number: FI20196079
Filing date: 2019-12-12
Publication date: 2020-06-22
Inventors: Tuomas Hapola; Mikko Martikka; Timo Eriksson; Erik Lindman
Applicant: Amer Sports Digital Services Oy
IPC main class:
Patent description:

[0001] The present invention relates to managing user data generated from sensor devices.

BACKGROUND
[0002] User sessions, such as activity sessions, may be recorded, for example in notebooks, spreadsheets or other suitable media. Recorded training sessions enable more systematic training, and progress toward set goals can be assessed and tracked from the records so produced. Such records may be stored for future reference, for example to assess progress an individual is making as a result of the training. An activity session may comprise a training session or another kind of session.
[0003] Personal sensor devices, such as, for example, sensor buttons, smart watches, smartphones or smart jewellery, may be configured to produce sensor data for session records. Such recorded sessions may be useful in managing physical training, child safety or in professional uses. Recorded sessions, or more generally sensor-based activity management, may be of varying type, such as, for example, running, walking, skiing, canoeing, wandering, or assisting the elderly.
[0004] Recorded sessions may be viewed using a personal computer, for example, wherein recordings may be copied from a personal device to the personal computer. Files
[0006] The invention is defined by the features of the independent claims. Some specific embodiments are defined in the dependent claims.
[0007] According to a first aspect of the present invention, there is provided a personal multi-sensor apparatus comprising a memory configured to store plural sequences of sensor data elements and at least one processing core configured to: derive, from the plural sequences of sensor data elements, plural sensor data segments, each sensor data segment comprising time-aligned sensor data element sub-sequences from at least two of the sequences of sensor data elements, and assign a label to at least some of the sensor data segments based on the sensor data elements comprised in the respective sensor data segments, to obtain a sequence of labels.
[0008] According to a second aspect of the present invention, there is provided a method in a personal multi-sensor apparatus, comprising storing plural sequences of sensor data elements, deriving, from the plural sequences of sensor data elements, plural sensor data segments, each sensor data segment comprising time-aligned sensor data element sub-sequences from at least two of the sequences of sensor data elements, and assigning a label to at least some of the sensor data segments based on the sensor data elements comprised in the respective sensor data segments, to obtain a sequence of labels.
[0011] According to a fifth aspect of the present invention, there is provided a computer program configured to cause a method in accordance with at least one of the second and fourth aspects to be performed.

BRIEF DESCRIPTION OF THE DRAWINGS
[0012] FIGURE 1 illustrates an example system in accordance with at least some embodiments of the present invention;

[0013] FIGURE 2A illustrates an example multisensorial time series;

[0014] FIGURE 2B illustrates a second example multisensorial time series;

[0015] FIGURE 3 illustrates an example apparatus capable of supporting at least some embodiments of the present invention;

[0016] FIGURE 4 illustrates signalling in accordance with at least some embodiments of the present invention, and

[0017] FIGURE 5 is a flow graph of a method in accordance with at least some embodiments of the present invention.

EMBODIMENTS
[0019] FIGURE 1 illustrates an example system in accordance with at least some embodiments of the present invention. The system comprises device 110, which may comprise, for example, a multi-sensor device, such as a personal multi-sensor device, for example a personal biosensor apparatus such as a smart watch, digital watch, sensor button, or another type of suitable device. In general, a biosensor apparatus may comprise a fitness sensor apparatus or a therapy sensor apparatus, for example. In the illustrated example, device 110 is attached to the user's ankle, but it may equally be otherwise associated with the user, for example by being worn around the wrist. A sensor button is a device comprising a set of sensors and a communications interface, configured to produce from each sensor a sequence of sensor data elements. A sensor button may be powered by a battery, or it may gain its energy from movements of the user, for example.
[0020] The sensors may be configured to measure acceleration, rotation, moisture, pressure and/or other variables, for example. In one specific embodiment, the sensors are configured to measure acceleration along three mutually orthogonal axes and rotation about three mutually orthogonal axes. The sensors may comprise single- or multi-axis magnetic field sensors, skin signal EMG, ECG, heartbeat and/or optical pulse sensors. Additionally or alternatively, human activity may be sensed via motion or use of sport utensils, tools, machinery and/or devices. In all, such sensors would produce six sequences of sensor data elements, such that in each sequence the sensor data elements are in chronological order, obtained once per sampling interval. The sampling intervals of the sensors do not need to be the same.
[0021] Device 110 may be communicatively coupled, directly or indirectly, with a communications network. For example, in FIGURE 1 device 110 is coupled, via wireless link 112, with base station 120. Base station 120 may comprise a cellular or non-cellular base station, wherein a non-cellular base station may be referred to as an access point. Examples of cellular technologies include wideband code division multiple access, WCDMA, and long term evolution, LTE, while examples of non-cellular technologies include wireless local area network, WLAN, and worldwide interoperability for microwave access, WiMAX. Base station 120 may be coupled with network node 130 via connection 123. Connection 123 may be a wire-line connection, for example. Network node 130 may comprise, for example, a controller or gateway device. Network node 130 may interface, via connection 134, with network 140, which may comprise, for example, the Internet or a corporate network. Network 140 may be coupled with further networks via connection
[0022] Device 110 may be configured to receive, directly or indirectly, from satellite constellation 150, satellite positioning information via satellite link 151. The satellite constellation may comprise, for example, the global positioning system, GPS, or the Galileo constellation. Satellite constellation 150 may comprise more than one satellite, although only one satellite is illustrated in FIGURE 1 for the sake of clarity. Likewise, receiving the positioning information over satellite link 151 may comprise receiving data from more than one satellite.
[0023] Where device 110 is indirectly coupled with the communications network and/or satellite constellation 150, it may be arranged to communicate with a personal device of user 101, such as a smartphone, which has connectivity with the communications network and/or satellite constellation 150. Device 110 may communicate with the personal device via, for example, a short-range communication technology such as the Bluetooth or Wibree technologies, or, indeed, via a cable. The personal device and device 110 may be considered to form a personal area network, PAN.
[0024] Alternatively or additionally to receiving data from a satellite constellation, device 110 or the personal device may obtain positioning information by interacting with a network in which base station 120 is comprised. For example, cellular networks may employ various ways to position a device, such as trilateration, multilateration or positioning based on an identity of a base station with which attachment is possible or ongoing. Likewise, a non-cellular base station, or access point, may know its own location and provide it to device 110 or the personal device, enabling device 110 and/or the personal device to position itself within communication range of this access point. Device 110 or the personal device may be configured to obtain a current time from satellite constellation 150, base station 120 or by requesting it from the user, for example.

[0025] Device 110 or the personal device may be configured to provide an activity session. An activity session may be associated with an activity type. Examples of activity types include rowing, paddling, cycling, jogging, walking, hunting, swimming and paragliding. In a simple form, an activity session may comprise storing sensor data produced with sensors comprised in device 110, the personal device or a server, for example. An activity session may be determined to have started and ended at certain points in time, such that the determination takes place afterward or concurrently with the starting and/or ending. In other words, device 110 may store sensor data to enable subsequent identification of activity sessions based at least partly on the stored sensor data.
[0026] An activity session may enhance a utility a user can obtain from the activity; for example, where the activity involves movement outdoors, the activity session may provide a recording of the activity session. A recording of an activity session may, in some embodiments, provide the user with contextual information. Such contextual information may comprise, for example, locally relevant weather information, received via base station 120, for example. Such contextual information may comprise at least one of the following: a rain warning, a temperature warning, an indication of time remaining before sunset, an indication of a nearby service that is relevant to the activity, a security warning, an indication of nearby users and an indication of a nearby location where several other users have taken photographs. Contextual information may be presented during an activity session.
[0027] A recording of an activity session may comprise information on at least one of the following: a route taken during the activity session, a metabolic rate or metabolic effect of the activity session, a time the activity session lasted, a quantity of energy consumed during the activity session, a sound recording obtained during the activity session and an elevation map along the length of the route taken during the activity session. A route may be determined based on positioning information, for example. Metabolic effect and consumed energy may be determined, at least partly, based on sensor data obtained from user 101 during the activity session. A recording may be stored in device 110, the personal device, or in a server or other cloud data storage service. A recording stored in a server or cloud may be encrypted prior to transmission to the server or cloud, to protect privacy of the user. A recording may be produced even if the user has not indicated an activity session has started, since a beginning and ending of an activity session may be determined after the session has ended, for example based, at least partly, on sensor data.
[0029] As described above, the plural sequences of sensor data elements may comprise data from more than one sensor, wherein the more than one sensor may comprise sensors of at least two distinct types. For example, plural sequences of sensor data elements may comprise sequences of acceleration sensor data elements and rotation sensor data elements. Further examples are sound volume sensor data, moisture sensor data and electromagnetic sensor data. In general, each sequence of sensor data elements may comprise data from one and only one sensor.

[0030] An activity type may be determined based, at least partly, on the sensor data elements. This determining may take place when the activity is occurring, or afterwards, when analysing the sensor data. The activity type may be determined by device 110 or by a server-side computer that has access to the sensor data, for example, or a server that is provided access to the sensor data. Where a server is given access to the sensor data, or, in some embodiments, when activity type detection is performed on device 110 or the personal device, the sensor data may be processed into a sequence of labels.
[0031] A sequence of labels may characterize the content of sensor data. For example, where the sensor data elements are numerical values obtained during jogging, a sequence of labels derived from those sensor data elements may comprise the sequence: {jog-step, jog-step, jog-step, jog-step, jog-step, ...}. Likewise, where the sensor data elements are numerical values obtained during a long jump, a sequence of labels derived from those sensor data elements may comprise the sequence: {sprint-step, sprint-step, sprint-step, sprint-step, sprint-step, leap, stop}. Likewise, where the sensor data elements are numerical values obtained during a triple jump, a sequence of labels derived from those sensor data elements may comprise the sequence: {sprint-step, sprint-step, sprint-step, sprint-step, leap, leap, leap, stop}. The sequences of labels are thus usable in identifying the activity type, for example differentiating between long jump and triple jump based on the number of leaps.
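The leap-counting distinction described above can be sketched in code. The following Python fragment is purely illustrative and is not part of the claimed apparatus; the rule and the label names are assumptions drawn only from the examples in this paragraph:

```python
def classify_jump(labels):
    """Differentiate long jump from triple jump by the number of 'leap'
    labels, as in the examples above (a simplistic, illustrative rule)."""
    leaps = labels.count("leap")
    if leaps == 1:
        return "long jump"
    if leaps == 3:
        return "triple jump"
    return "unknown"

# A long jump: five sprint-steps, one leap, then a stop.
print(classify_jump(["sprint-step"] * 5 + ["leap", "stop"]))  # long jump
```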
[0032] The labels may be expressed in natural language or as indices to a pre-defined table, which may be dynamically updatable as new kinds of exercise primitives become known. For example, in the table a jog-step may be represented as 01, a sprint-step (that is, a step in running much faster than jogging) as 02, a leap as 03, and a stopping of motion as 04. Thus the triple jump would be represented as a sequence of labels {02, 02, 02, 02, 03, 03, 03, 04}. The activity, for example a triple jump, may be detected from the labels, while the sequence of labels takes up significantly less space than the original sequences of sensor data elements.
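The index representation above can be sketched as follows. This is a minimal sketch with a hypothetical table holding the example values from this paragraph; a real table would be updatable as new primitives become known:

```python
# Hypothetical label table mapping exercise-primitive names to compact
# indices (01..04 in the example above).
LABEL_TABLE = {"jog-step": 1, "sprint-step": 2, "leap": 3, "stop": 4}

def encode_labels(labels):
    """Encode a sequence of natural-language labels as table indices."""
    return [LABEL_TABLE[label] for label in labels]

# The triple jump of the example compresses to {02, 02, 02, 02, 03, 03, 03, 04}.
triple_jump = ["sprint-step"] * 4 + ["leap"] * 3 + ["stop"]
print(encode_labels(triple_jump))  # [2, 2, 2, 2, 3, 3, 3, 4]
```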
[0033] To process the sequences of sensor data elements into a sequence of labels, sensor data segments may be derived from the sequences of sensor data elements. Each sensor data segment may then be associated with an exercise primitive and assigned a label, to obtain the sequence of labels. Each sensor data segment may comprise time-aligned sensor data element sub-sequences from at least two of the sequences of sensor data elements. In other words, segments of sensor data are derived, each such segment comprising a time slice of the original sequences of sensor data elements. This may be conceptualized as time-slicing a multi-sensor data stream captured during jogging into the individual steps that make up the jogging session. Likewise, other activity sessions may be time-sliced into the exercise primitives which make up the activity.
[0034] To derive the segments, device 110 or another device may be configured to analyse the sequences of sensor data elements to identify units therein. Each segment may comprise slices of the sequences of sensor data elements, the slices being time-aligned, that is, obtained at the same time from the respective sensors.
[0036] In the case of motion, one way to segment the sensor data is to try to construct a relative trajectory of the sensor device. One way to estimate this trajectory is to double-integrate the x-, y-, and z-components of the acceleration sensor outputs. In this process one may remove gravity-induced biases. Mathematically this can be done by calculating the baseline of each output. One way is to filter the data as in the following equation.
[0037] acc_i_baseline = acc_i_baseline + coeff_a * (acc_i - acc_i_baseline)
[0038] acc above refers to the acceleration measurement and i refers to its components x, y, and z. These filtered values can be subtracted from the actual measurements: acc_i_without_G = acc_i - acc_i_baseline. This is a rough estimate of the true linear acceleration, but still a fast and robust way to estimate it. The integration of these linear acceleration values leads to an estimate of the velocity of the sensor device in three-dimensional, 3D, space. The velocity components have biases due to the incomplete linear acceleration estimate. These biases may be removed as in the previous equation:
[0039] v_i_baseline = v_i_baseline + coeff_v * (v_i - v_i_baseline)

[0040] v above refers to the velocity estimate and i refers to its components x, y, and z. These velocity components are not true velocities of the sensor device, but easily and robustly calculated estimates of them. The baseline components may be subtracted from the velocity estimates before integration: v_i_wo_bias = v_i - v_i_baseline. Since the method so far is incomplete, the integrals of the velocity components produce biased position estimates p_x, p_y, and p_z. Therefore these biases need to be removed as in the previous equations:

[0041] p_i_baseline = p_i_baseline + coeff_p * (p_i - p_i_baseline)

[0042] p above refers to the position estimate and i refers to its components. Since this procedure effectively produces zero-mean values, the natural reference of position is p_x_ref = 0, p_y_ref = 0, and p_z_ref = 0. The Euclidean distances of the measured values, sqrt(p_x_ti**2 + p_y_ti**2 + p_z_ti**2), form a time series varying from 0 to some maximum value, where ti refers to the index in the time series. These maximum values can be detected easily. The moment in time of one maximum value starts a segment and the next maximum value ends it (and starts the next segment). The detection of the maximum value can be conditional, i.e. the maximum value is accepted as a start/stop marker only when it exceeds a certain level.
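The baseline-removal and double-integration chain of paragraphs [0037] to [0042] can be sketched as follows. This is a rough sketch under stated assumptions: the coefficient values, the single shared coefficient, the fixed sampling interval dt and the simple local-maximum test are illustrative choices, not taken from the description:

```python
import math

def remove_baseline(samples, coeff):
    """Running baseline filter: baseline += coeff * (x - baseline);
    returns the samples with the running baseline subtracted."""
    baseline = 0.0
    out = []
    for x in samples:
        baseline += coeff * (x - baseline)
        out.append(x - baseline)
    return out

def integrate(samples, dt):
    """Cumulative rectangle-rule integration of a sampled signal."""
    total, out = 0.0, []
    for x in samples:
        total += x * dt
        out.append(total)
    return out

def segment_markers(acc_x, acc_y, acc_z, dt, coeff=0.1, level=0.0):
    """Estimate a relative trajectory from 3-axis acceleration and return
    indices of local maxima of the distance from the zero reference;
    consecutive markers delimit segments."""
    positions = []
    for axis in (acc_x, acc_y, acc_z):
        lin_acc = remove_baseline(axis, coeff)                # gravity bias removed
        vel = remove_baseline(integrate(lin_acc, dt), coeff)  # velocity, bias removed
        pos = remove_baseline(integrate(vel, dt), coeff)      # position, bias removed
        positions.append(pos)
    dist = [math.sqrt(x * x + y * y + z * z) for x, y, z in zip(*positions)]
    # conditional maximum detection: a local maximum is accepted as a
    # start/stop marker only when it exceeds 'level'
    return [i for i in range(1, len(dist) - 1)
            if dist[i - 1] <= dist[i] > dist[i + 1] and dist[i] > level]
```

With coeff = 1.0 the baseline tracks the input exactly, so the filter output is all zeros; smaller coefficients make the baseline adapt more slowly, which is what allows the near-constant gravity component to be separated from the faster motion signal.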
[0043] Also, the above-described procedure to calculate the relative trajectory can be made more precise by utilizing the gyroscopes and using, e.g., complementary filtering.
[0044] Other ways to segment the data, that is, derive the segments, may include fitting to a periodic model, using a suitably trained artificial neural network or using a separate segmenting signal provided over a radio or wire-line interface, for example. The segmenting signal may be correlated in time with the sequences of sensor data elements, to obtain the segments. A segmenting signal may be transmitted or provided by a video recognition system or pressure pad system, for example. Such a video recognition system may be configured to identify steps, for example.

[0045] Once the segments have been derived, each segment may be assigned a label. Assigning the label may comprise identifying the segment. The identification may comprise comparing the sensor data comprised in the segment to a library of reference segments, for example in a least-squares sense, and selecting from the library of reference segments a reference segment which most resembles the segment to be labelled. The label assigned to the segment will then be a label associated with the closest reference segment in the library of reference segments.
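The least-squares comparison against a reference segment library could look like the sketch below. The library contents are made-up placeholders, and the sketch assumes that segment and references have already been resampled to equal length; it is not a definitive implementation of the claimed labelling:

```python
def squared_error(segment, reference):
    """Sum of squared differences between a segment and a reference
    (both assumed resampled to the same length beforehand)."""
    return sum((s - r) ** 2 for s, r in zip(segment, reference))

def assign_label(segment, reference_library):
    """Pick the label of the reference segment that most resembles the
    given segment in the least-squares sense."""
    return min(reference_library,
               key=lambda label: squared_error(segment, reference_library[label]))

# Placeholder single-sensor reference library.
library = {
    "jog-step":    [0.0, 1.0, 0.0, -1.0],
    "sprint-step": [0.0, 2.0, 0.0, -2.0],
}
print(assign_label([0.1, 1.1, -0.1, -0.9], library))  # jog-step
```

A multi-sensorial library, as discussed later in the description, could extend this by summing the per-sensor errors before taking the minimum.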
[0046] In some embodiments, a plurality of reference segment libraries is used, such that a first phase of the identification is selection of a reference segment library. For example, where two reference segment libraries are used, one of them could be used for continuous activity types and a second one of them could be used for discontinuous activity types. The continuous activity type is selected where the sequences of sensor data elements reflect a repetitive action which repeats a great number of times, such as jogging, walking, cycling or rowing. The discontinuous activity type is selected when the activity is characterized by brief sequences of action which are separated from each other in time, the afore-mentioned triple jump or pole vault being examples. Once the reference segment library is chosen, all the segments are labelled with labels from the selected reference segment library.

[0047] A benefit of first selecting a reference segment library is obtained in more effective labelling, as there is a lower risk that segments are assigned incorrect labels. This is so since the number of reference segments the sensor data segments are compared to is lower, increasing the chances that a correct one is chosen.
[0048] Once the segments have been labelled, a syntax check may be made wherein it is assessed whether the sequence of labels makes sense. For example, if the sequence of labels is consistent with known activity types, the syntax check is passed. On the other hand, if the sequence of labels comprises labels which do not fit together, a syntax error may be generated. As an example, a sequence of jogging steps which comprises, mixed therein, a few paddling motions would generate a syntax error, since the user cannot really be jogging and paddling at the same time. In some embodiments, a syntax error may be resolved by removing from the sequence of labels the labels which do not fit in, in case they occur in the sequence of labels only rarely, for example at a rate of less than 2%.

[0049] The reference segment libraries may comprise indications as to which labels fit together, to enable handling syntax error situations.
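A syntax check with the rare-label resolution rule (removal below a 2% rate) might be sketched as follows. The compatibility table is a made-up placeholder standing in for the indications a reference segment library could carry:

```python
# Placeholder compatibility data: which labels fit together per activity.
COMPATIBLE = {
    "jogging":  {"jog-step", "stop"},
    "paddling": {"paddle-stroke", "stop"},
}

def syntax_check(labels, activity):
    """Pass labels consistent with the activity; drop inconsistent labels
    if they are rare (< 2% of the sequence), otherwise raise a syntax error."""
    allowed = COMPATIBLE[activity]
    misfits = [lab for lab in labels if lab not in allowed]
    if not misfits:
        return labels
    if len(misfits) / len(labels) < 0.02:
        # rare misfits: resolve the syntax error by removing them
        return [lab for lab in labels if lab in allowed]
    raise ValueError("syntax error: %r do not fit activity %r"
                     % (sorted(set(misfits)), activity))

# One paddling motion mixed into 99 jogging steps is removed silently.
cleaned = syntax_check(["jog-step"] * 99 + ["paddle-stroke"], "jogging")
print(len(cleaned))  # 99
```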
[0050] Different exercise primitives may be associated with different characteristic frequencies. For example, acceleration sensor data may reflect a higher characteristic frequency when the user has been running, as opposed to walking. Thus the labelling of the segments may be based, in some embodiments, at least partly, on deciding which reference segment has a characteristic frequency that most closely matches a characteristic frequency of a section of the sequence of sensor data elements under investigation. Alternatively or in addition, acceleration sensor data may be employed to determine a characteristic movement amplitude.

[0051] The reference segment libraries may comprise reference datasets that are multi-sensorial in nature in such a way that each reference segment comprises data that may be compared to each sensor data type that is available. For example, where device 110 is configured to compile a time series of acceleration and sound sensor data types, the reference segments may comprise reference datasets, each reference segment corresponding to a label, wherein each reference segment comprises data that may be compared with the acceleration data and data that may be compared with the sound data, for example. The determined label may be determined as the label that is associated with the multi-sensorial reference segment that most closely matches the segment stored by device 110, for example. Device 110 may comprise, for example, microphones and cameras. Furthermore, a radio receiver may, in some cases, be configurable to measure electric or magnetic field properties. Device 110 may comprise a radio receiver, in general, where device 110 is furnished with a wireless communication capability.
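Matching by characteristic frequency could be sketched as below. The mean-crossing estimator is a simple stand-in for whatever spectral analysis an implementation would actually use, and the reference frequencies are invented values:

```python
import math

def characteristic_frequency(samples, fs):
    """Crude characteristic-frequency estimate from the mean-crossing
    count; a real implementation might use an FFT instead."""
    mean = sum(samples) / len(samples)
    crossings = sum(1 for a, b in zip(samples, samples[1:])
                    if (a - mean) * (b - mean) < 0)
    # roughly two mean-crossings per oscillation cycle
    return crossings * fs / (2.0 * len(samples))

def closest_by_frequency(segment, fs, reference_freqs):
    """Pick the reference label whose characteristic frequency is nearest
    to that of the segment."""
    f = characteristic_frequency(segment, fs)
    return min(reference_freqs, key=lambda label: abs(reference_freqs[label] - f))

# Synthetic 2 Hz "walking" acceleration signal sampled at 100 Hz for 2 s.
walk = [math.sin(2 * math.pi * 2.0 * t / 100.0) for t in range(200)]
print(closest_by_frequency(walk, 100.0, {"walking": 2.0, "running": 3.5}))
```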
[0052] An example of activity type identification by segmenting and labelling is swimming, wherein device 110 stores sequences of sensor data elements that comprise moisture sensor data elements and magnetic field sensor data elements. The moisture sensor data elements indicating presence of water would cause a water-sport reference segment library to be used. Swimming may involve elliptical movements of an arm, to which device 110 may be attached, which may be detectable as periodically varying magnetic field data. In other words, the direction of the Earth's magnetic field may vary from the point of view of the magnetic field sensor in a periodic way in the time series. This would enable labelling the segments as, for example, breast-stroke swimming motions.
[0053] Overall, a determined, or derived, activity type may be considered an estimated activity type until the user has confirmed the determination is correct. In some embodiments, a few, for example two or three, most likely activity types may be presented to the user as estimated activity types for the user to choose the correct activity type from.
[0054] Where device 110 or a personal device assigns the labels, the sequence of labels may be transmitted to a network server, for example, for storage. Device 110, the personal device or the server may determine an overall activity type the user is engaged in, based on the labels. This may be based on a library of reference label sequences, for example.

[0055] In general, device 110 or the personal device may receive a machine readable instruction, such as an executable program or executable script, from the server or another network entity. The machine readable instruction may be usable in determining activity type from the sequence of labels, and/or in assigning the labels to sensor data segments. In the latter case, the machine readable instruction may be referred to as a labelling instruction.
[0056] The process may adaptively learn, based on the machine readable instructions, how to more accurately assign labels and/or determine activity types. A server may have access to information from a plurality of users, and high processing capability, and thus be more advantageously placed to update the machine readable instructions than device 110, for example.
[0057] The machine readable instructions may be adapted by the server. For example, a user who first obtains a device 110 may initially be provided, responsive to messages sent from device 110, with machine readable instructions that reflect an average user population. Thereafter, as the user engages in activity sessions, the machine readable instructions may be adapted to more accurately reflect use by this particular user. For example, limb length may affect periodical properties of sensor data captured while the user is swimming or running. To enable the adapting, the server may request sensor data from device 110, for example periodically, and compare sensor data so obtained to the machine readable instructions, to hone the instructions for future use with this particular user. Thus a beneficial effect is obtained in fewer incorrectly labelled segments, and more effective and accurate compression of the sensor data.

[0058] FIGURE 2A illustrates an example of plural sequences of sensor data elements. On the upper axis, 201, is illustrated a sequence of moisture sensor data elements 210, while the lower axis, 202, illustrates a time series 220 of deviation of magnetic north from an axis of device 110, that is, a sequence of magnetic sensor data elements.
[0059] The moisture sequence 210 displays an initial portion of low moisture, followed by a rapid increase of moisture that then remains at a relatively constant, elevated level before beginning to decline, at a lower rate than the increase, as device 110 dries.

[0060] Magnetic deviation sequence 220 displays an initial, erratic sequence of deviation changes owing to movement of the user as he operates a locker room lock, for example, followed by a period of approximately periodic movements, before an erratic sequence begins once more. The wavelength of the periodically repeating motion has been exaggerated in FIGURE 2 to render the illustration clearer.
[0062] FIGURE 2B illustrates a second example of plural sequences of sensor data elements. In FIGURE 2B, like numbering denotes like elements as in FIGURE 2A. Unlike in FIGURE 2A, not one but two activity sessions are determined in the time series of FIGURE 2B. Namely, a cycling session is determined to start at beginning point 207 and to end at point 203, when the swimming session begins. Thus the compound activity session may relate to triathlon, for example. In cycling, moisture remains low, and magnetic deviation changes only slowly, for example as the user cycles in a velodrome. The segments would thus comprise two segments between points 207 and 203, and three segments between points 203 and 205. The sequence of labels could be {cycling, cycling, freestroke, freestroke, freestroke}. Again, the number of segments is dramatically reduced for the sake of clarity of illustration.
[0063] FIGURE 3 illustrates an example apparatus capable of supporting at least some embodiments of the present invention. Illustrated is device 300, which may comprise, for example, device 110 of FIGURE 1. Comprised in device 300 is processor 310, which may comprise, for example, a single- or multi-core processor, wherein a single-core processor comprises one processing core and a multi-core processor comprises more than one processing core. Processor 310 may comprise more than one processor. A processing core may comprise, for example, a Cortex-A8 processing core designed by ARM Holdings or an Excavator processing core produced by Advanced Micro Devices Corporation. Processor 310 may comprise at least one Qualcomm Snapdragon and/or Intel Atom processor. Processor 310 may comprise at least one application-specific integrated circuit, ASIC. Processor 310 may comprise at least one field-programmable gate array, FPGA. Processor 310 may be means for performing method steps in device 300. Processor 310 may be configured, at least in part by computer instructions, to perform actions.
[0064] Device 300 may comprise memory 320. Memory 320 may comprise random-access memory and/or permanent memory. Memory 320 may comprise at least one RAM chip. Memory 320 may comprise solid-state, magnetic, optical and/or holographic memory, for example. Memory 320 may be at least in part accessible to processor 310. Memory 320 may be at least in part comprised in processor 310. Memory 320 may be means for storing information. Memory 320 may comprise computer instructions that processor 310 is configured to execute. When computer instructions configured to cause processor 310 to perform certain actions are stored in memory 320, and device 300 overall is configured to run under the direction of processor 310 using computer instructions from memory 320, processor 310 and/or its at least one processing core may be considered to be configured to perform said certain actions. Memory 320 may be at least in part external to device 300 but accessible to device 300.
[0065] Device 300 may comprise a transmitter 330. Device 300 may comprise a receiver 340. Transmitter 330 and receiver 340 may be configured to transmit and receive, respectively, information in accordance with at least one cellular or non-cellular standard. Transmitter 330 may comprise more than one transmitter. Receiver 340 may comprise more than one receiver. Transmitter 330 and/or receiver 340 may be configured to operate in accordance with global system for mobile communication, GSM, wideband code division multiple access, WCDMA, long term evolution, LTE, IS-95, wireless local area network, WLAN, Ethernet and/or worldwide interoperability for microwave access, WiMAX, standards, for example.
[0066] Device 300 may comprise a near-field communication, NFC, transceiver 350. NFC transceiver 350 may support at least one NFC technology, such as NFC, Bluetooth, Wibree or similar technologies.

[0067] Device 300 may comprise user interface, UI, 360. UI 360 may comprise at least one of a display, a keyboard, a touchscreen, a vibrator arranged to signal to a user by causing device 300 to vibrate, a speaker and a microphone. A user may be able to operate device 300 via UI 360, for example to manage activity sessions.

[0068] Device 300 may comprise or be arranged to accept a user identity module 370. User identity module 370 may comprise, for example, a subscriber identity module, SIM, card installable in device 300. A user identity module 370 may comprise information identifying a subscription of a user of device 300. A user identity module 370 may comprise cryptographic information usable to verify the identity of a user of device 300 and/or to facilitate encryption of communicated information and billing of the user of device 300 for communication effected via device 300.
[0069] Processor 310 may be furnished with a transmitter arranged to output information from processor 310, via electrical leads internal to device 300, to other devices comprised in device 300. Such a transmitter may comprise a serial bus transmitter arranged to, for example, output information via at least one electrical lead to memory 320 for storage therein. Alternatively to a serial bus, the transmitter may comprise a parallel bus transmitter. Likewise processor 310 may comprise a receiver arranged to receive information in processor 310, via electrical leads internal to device 300, from other devices comprised in device 300. Such a receiver may comprise a serial bus receiver arranged to, for example, receive information via at least one electrical lead from receiver 340 for processing in processor 310. Alternatively to a serial bus, the receiver may comprise a parallel bus receiver.
[0070] Device 300 may comprise further devices not illustrated in FIGURE 3. For example, where device 300 comprises a smartphone, it may comprise at least one digital camera. Some devices 300 may comprise a back-facing camera and a front-facing camera, wherein the back-facing camera may be intended for digital photography and the front-facing camera for video telephony. Device 300 may comprise a fingerprint sensor arranged to authenticate, at least in part, a user of device 300. In some embodiments, device 300 lacks at least one device described above. For example, some devices 300 may lack an NFC transceiver 350 and/or user identity module 370.
[0071] Processor 310, memory 320, transmitter 330, receiver 340, NFC transceiver 350, UI 360 and/or user identity module 370 may be interconnected by electrical leads internal to device 300 in a multitude of different ways. For example, each of the aforementioned devices may be separately connected to a master bus internal to device 300, to allow for the devices to exchange information. However, as the skilled person will appreciate, this is only one example and depending on the embodiment various ways of interconnecting at least two of the aforementioned devices may be selected without departing from the scope of the present invention.
[0073] Phase 410 may comprise one or more activity sessions of at least one activity type. Where multiple activity sessions are present, they may be of the same activity type or different activity types. The user need not, in at least some embodiments, indicate to device 110 that activity sessions are ongoing. During phase 410, device 110 may, but in some embodiments need not, identify activity types or sessions. The sequences of sensor data elements compiled during phase 410 may last 10 minutes or 2 hours, for example. As a specific example, the time series may last from the previous time sensor data was downloaded from device 110 to another device, such as, for example, personal computer PC1.
[0074] Further, in phase 410, device 110 segments the sequences of sensor data elements to plural sensor data segments, as described herein above. These segments are then assigned labels to obtain a conversion of the sequences of sensor data elements to a sequence of labels.
[0075] In phase 420, the sequence of labels is provided, at least partly, to server SRV. This phase may further comprise providing to server SRV optional activity and/or event reference data. The providing may proceed via base station 120, for example. The sequence of labels may be encrypted en route to the server to protect the user’s privacy.

[0076] In phase 430, server SRV may determine, based at least partly on the sequence of labels in the message of phase 420, an associated machine readable instruction. The machine readable instruction may relate, for example, to improved labelling of segments relating to activities related to the labels in the sequence of labels received in server SRV from device 110 in phase 420.

[0077] In phase 440 the machine readable instruction determined in phase 430 is provided to device 110, enabling, in phase 450, a more accurate labelling of segments of sensor data.
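The server-side step of phase 430 may be informally sketched as a lookup from known label sequences to machine readable instructions. The fragment below is a hypothetical illustration only: the function name, the table contents and the label strings are invented for the example, and the patent does not prescribe any concrete data structure for the association.

```python
# Hypothetical sketch of phase 430: server SRV holds a mapping from
# known label sequences to machine readable instructions and selects
# the instruction associated with the label sequence received from
# device 110. All names and labels below are illustrative.

def determine_instruction(received_labels, instruction_table):
    """Return the instruction associated with the received label
    sequence, or None when no stored sequence matches."""
    return instruction_table.get(tuple(received_labels))

# Illustrative table: each stored label sequence is associated with an
# (assumed) instruction for improving the labelling of segments.
table = {
    ("walk", "run", "run", "walk"): "update_running_classifier",
    ("idle", "cycle", "cycle"): "update_cycling_classifier",
}

print(determine_instruction(["walk", "run", "run", "walk"], table))
# -> update_running_classifier
```

A dictionary keyed by label tuples is only one possible realization; a server could equally use approximate matching over the list of stored label sequences, as described for the server apparatus claims.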
[0078] FIGURE 5 is a flow graph of a method in accordance with at least some embodiments of the present invention. The phases of the illustrated method may be performed in device 110, an auxiliary device or a personal computer, for example, or in a control device configured to control the functioning thereof, when installed therein.
[0079] Phase 510 comprises storing plural sequences of sensor data elements. Phase 520 comprises deriving, from the plural sequences of sensor data elements, plural sensor data segments, each sensor data segment comprising time-aligned sensor data element sub-sequences from at least two of the sequences of sensor data elements. Finally, phase 530 comprises assigning a label to at least some of the sensor data segments based on the sensor data elements comprised in the respective sensor data segments, to obtain a sequence of labels.
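As an informal illustration of phases 510–530, the sketch below cuts two sequences of sensor data elements into time-aligned sub-sequences and assigns a label to each resulting segment. The fixed window length and the threshold-based labelling rule are assumptions made for the example; the patent leaves the segmentation and labelling methods open (for instance, a suitably trained artificial neural network may be used).

```python
# Minimal sketch of phases 510-530 under an assumed fixed-length
# windowing scheme. The window size and labelling rule are
# illustrative, not taken from the patent.

def derive_segments(sequences, window):
    """Phase 520: cut every sequence into time-aligned sub-sequences of
    `window` elements; each segment combines the sub-sequences of all
    sensors for the same time span."""
    n = min(len(s) for s in sequences)
    return [
        [s[start:start + window] for s in sequences]
        for start in range(0, n - window + 1, window)
    ]

def label_segments(segments, labeller):
    """Phase 530: assign one label per segment, yielding a label sequence."""
    return [labeller(seg) for seg in segments]

# Two sensor sequences (e.g. acceleration magnitude and heart rate),
# segmented into windows of three samples each.
acc = [0.1, 0.2, 0.1, 2.0, 2.2, 2.1]
hr = [60, 61, 62, 120, 121, 122]
segments = derive_segments([acc, hr], window=3)

# Illustrative labelling rule: high mean acceleration -> "run".
labels = label_segments(
    segments,
    lambda seg: "run" if sum(seg[0]) / len(seg[0]) > 1.0 else "idle",
)
print(labels)  # -> ['idle', 'run']
```

The resulting label sequence is much more compact than the raw sensor data, which is what makes providing it to a server, as in phase 420, lightweight.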
[0080] It is to be understood that the embodiments of the invention disclosed are not limited to the particular structures, process steps, or materials disclosed herein, but are extended to equivalents thereof as would be recognized by those ordinarily skilled in the relevant arts. It should also be understood that terminology employed herein is used for the purpose of describing particular embodiments only and is not intended to be limiting.
[0081] Reference throughout this specification to one embodiment or an embodiment means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Where reference is made to a numerical value using a term such as, for example, about or substantially, the exact numerical value is also disclosed.

[0082] As used herein, a plurality of items, structural elements, compositional elements, and/or materials may be presented in a common list for convenience. However, these lists should be construed as though each member of the list is individually identified as a separate and unique member. Thus, no individual member of such list should be construed as a de facto equivalent of any other member of the same list solely based on their presentation in a common group without indications to the contrary. In addition, various embodiments and examples of the present invention may be referred to herein along with alternatives for the various components thereof. It is understood that such embodiments, examples, and alternatives are not to be construed as de facto equivalents of one another, but are to be considered as separate and autonomous representations of the present invention.
[0083] Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the preceding description, numerous specific details are provided, such as examples of lengths, widths, shapes, etc., to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the invention.
[0085] The verbs “to comprise” and “to include” are used in this document as open limitations that neither exclude nor require the existence of also un-recited features. The features recited in depending claims are mutually freely combinable unless otherwise explicitly stated. Furthermore, it is to be understood that the use of "a" or "an", that is, a singular form, throughout this document does not exclude a plurality.
INDUSTRIAL APPLICABILITY

[0086] At least some embodiments of the present invention find industrial application in facilitating analysis of sensor data.

ACRONYMS LIST
GPS Global Positioning System
LTE Long Term Evolution
NFC Near-Field Communication
WCDMA Wideband Code Division Multiple Access
WiMAX Worldwide Interoperability for Microwave Access
WLAN Wireless Local Area Network
REFERENCE SIGNS LIST

Network Node
Satellite Constellation
201, 202 Axes in FIGURE 2
203, 205, 207 Activity session endpoints in FIGURE 2 and FIGURE 2B
210, 220 Sensor data time series in FIGUREs 2 and 2B
310-370 Structure illustrated in FIGURE 3
410-450 Phases of the method of FIGURE 4
510-530 Phases of the method of FIGURE 5
Claims (17)
[1] 1. A personal multi-sensor apparatus comprising:
— a memory configured to store plural sequences of sensor data elements, and
— at least one processing core configured to:
▪ derive, from the plural sequences of sensor data elements, plural sensor data segments, each sensor data segment comprising time-aligned sensor data element sub-sequences from at least two of the sequences of sensor data elements, and
▪ assign a label to at least some of the sensor data segments based on the sensor data elements comprised in the respective sensor data segments, to obtain a sequence of labels,
— wherein the apparatus is further configured to determine, based on the sequence of labels, an activity type a user has engaged in while the sequences of sensor data have been obtained.
[2] 2. The apparatus according to claim 1, wherein the apparatus is further configured to transmit the sequence of labels to a node in a network.
[3] 3. The apparatus according to claim 1, wherein the apparatus is configured to receive, from a node in a network, a machine readable instruction, and to employ the machine readable instruction in determining the activity type.
[4] 4. The apparatus according to claim 3, wherein the machine readable instruction comprises at least one of the following: an executable program and an executable script.
[5] 5. The apparatus according to any of claims 1 — 4, wherein the apparatus is configured to receive, from a network, at least one labelling instruction, and to employ the at least one machine readable labelling instruction in the assigning of the label to each sensor data segment.
[6] 6. The apparatus according to claim 5, wherein the machine readable labelling instruction comprises at least one of the following: an executable program and an executable script.
[7] 7. The apparatus according to any of claims 1 — 6, wherein each of the plural sequences of sensor data elements comprises sensor data elements originating in exactly one sensor.
[8] 8. The apparatus according to any of claims 1 — 7, wherein the plural sequences of sensor data elements comprise at least three sequences of sensor data elements.
[9] 9. The apparatus according to any of claims 1 — 8, wherein the plural sequences of sensor data elements comprise at least nine sequences of sensor data elements.
[10] 10. The apparatus according to any of claims 1 — 9, wherein the apparatus is configured to derive the plural sensor data segments using, at least in part, a suitably trained artificial neural network.
[11] 11. A method in a personal multi-sensor apparatus, comprising:
— storing plural sequences of sensor data elements;
— deriving, from the plural sequences of sensor data elements, plural sensor data segments, each sensor data segment comprising time-aligned sensor data element sub-sequences from at least two of the sequences of sensor data elements,
— assigning a label to at least some of the sensor data segments based on the sensor data elements comprised in the respective sensor data segments, to obtain a sequence of labels, and
— determining, based on the sequence of labels, an activity type a user has engaged in while the sequences of sensor data have been obtained.
[12] 12. The method according to claim 11, further comprising transmitting the sequence of labels to a node in a network.
[13] 13. The method according to claim 11, further comprising receiving, from a node in a network, a machine readable instruction, and employing the machine readable instruction in determining the activity type.
[14] 14. A server apparatus comprising:
— a receiver configured to receive a sequence of labels assigned based on sensor data elements, the sensor data elements not being comprised in the sequence of labels and the labels not comprising the sensor data elements, and
— at least one processing core configured to:
▪ determine, based on the sequence of labels, an activity type a user has engaged in while the sensor data elements have been obtained.
[15] 15. The server apparatus according to claim 14, wherein the server apparatus is configured to determine the activity type based on comparing the received sequence of labels with a list of label sequences stored in the server apparatus, and by selecting an activity type which is associated with a sequence of labels in the list which matches the received sequence of labels.
[16] 16. A method in a server apparatus, comprising: — receiving a sequence of labels assigned based on sensor data elements, the sensor data elements not being comprised in the sequence of labels and the labels not comprising the sensor data elements, and — determining, based on the sequence of labels, an activity type a user has engaged in while the sensor data elements have been obtained.
[17] 17. A computer program configured to cause a method in accordance with at least one of claims 11 — 13 or 16 to be performed.
Similar technologies:
Publication number | Publication date | Patent title
Gjoreski et al.|2018|The university of sussex-huawei locomotion and transportation dataset for multimodal analytics with mobile devices
US20150169659A1|2015-06-18|Method and system for generating user lifelog
US10433768B2|2019-10-08|Activity intensity level determination
US20170176213A1|2017-06-22|Sensor based context management
KR20140116481A|2014-10-02|Activity classification in a multi-axis activity monitor device
US8990011B2|2015-03-24|Determining user device's starting location
US20130080255A1|2013-03-28|Step detection and step length estimation
US20170172468A1|2017-06-22|Activity intensity level determination
CN102567502A|2012-07-11|Supplementing biometric identification with device identification
WO2011092639A1|2011-08-04|Systems, methods, and apparatuses for providing context-based navigation services
EP2835769A1|2015-02-11|Method, device and system for annotated capture of sensor data and crowd modelling of activities
US10264392B2|2019-04-16|Location and activity aware content delivery system
US20190142307A1|2019-05-16|Sensor data management
US20170281079A1|2017-10-05|Tracking caloric expenditure using sensor driven fingerprints
US20170172466A1|2017-06-22|Activity intensity level determination
CN110916673A|2020-03-27|Gait monitoring method and intelligent equipment
CN106454723A|2017-02-22|Mobile phone accelerometer based child custody method
FI20196079A1|2020-06-22|Sensor data management
EP3431002B1|2021-11-03|Rf based monitoring of user activity
CN109314837A|2019-02-05|The backfill of exercise route based on geographical location
US20170272902A1|2017-09-21|Handling sensor information
EP2751704B1|2018-03-07|Method and apparatus for determining environmental context utilizing features obtained by multiple radio receivers
GB2579998A|2020-07-08|Sensor Based context management
US20200135076A1|2020-04-30|Method for controlling a display
FI20206293A1|2021-07-01|Method for controlling a display
Patent family:
Publication number | Publication date
TWI729596B|2021-06-01|
GB2581014A|2020-08-05|
GB2581014B|2021-09-22|
CN111351524A|2020-06-30|
DE102019008548A1|2020-06-25|
GB201917731D0|2020-01-15|
TW202032327A|2020-09-01|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title

GB2425180B|2005-04-14|2009-03-18|Justin Pisani|Monitoring system|
JP5028751B2|2005-06-09|2012-09-19|ソニー株式会社|Action recognition device|
US8187182B2|2008-08-29|2012-05-29|Dp Technologies, Inc.|Sensor fusion for activity identification|
WO2010083562A1|2009-01-22|2010-07-29|National Ict Australia Limited|Activity detection|
WO2011105914A1|2010-02-24|2011-09-01|Ackland, Kerri Anne|Classification system and method|
JP5718465B2|2010-08-09|2015-05-13|ナイキ イノベイト シーブイ|Fitness monitoring method, apparatus, computer readable medium, and system using mobile devices|
US8917907B2|2011-02-28|2014-12-23|Seiko Epson Corporation|Continuous linear dynamic systems|
US20150119728A1|2011-12-02|2015-04-30|Fitlinxx, Inc.|Health monitor|
WO2014118767A1|2013-02-03|2014-08-07|Sensogo Ltd.|Classifying types of locomotion|
JP5803962B2|2013-03-22|2015-11-04|ソニー株式会社|Information processing apparatus, sensor apparatus, information processing system, and recording medium|
KR101500662B1|2013-10-18|2015-03-09|경희대학교 산학협력단|Apparatus and method for activity recognizing using mobile device|
CN104680046B|2013-11-29|2018-09-07|华为技术有限公司|A kind of User Activity recognition methods and device|
CN111035394A|2014-09-02|2020-04-21|苹果公司|Body activity and fitness monitor|
WO2017040319A1|2015-08-28|2017-03-09|Focus Ventures, Inc.|System and method for automatically time labeling repetitive data|
CN105242779B|2015-09-23|2018-09-04|歌尔股份有限公司|A kind of method and mobile intelligent terminal of identification user action|
US20170232294A1|2016-02-16|2017-08-17|SensorKit, Inc.|Systems and methods for using wearable sensors to determine user movements|
US9830516B1|2016-07-07|2017-11-28|Videoken, Inc.|Joint temporal segmentation and classification of user activities in egocentric videos|
Legal status:
Priority:
Application number | Filing date | Patent title
US16/228,981|US20190142307A1|2015-12-21|2018-12-21|Sensor data management|